
    A Theoretical Model for the Extraction and Refinement of Natural Resources

    The modelling of production in microeconomics has been the subject of heated debate. The controversial issues include the substitutability between production inputs, the role of time and the economic consequences of irreversibility in the production process. A case in point is the use of Cobb-Douglas type production functions. This approach completely ignores the physical process underlying the production of a good. We examine these issues in the context of the production of a basic commodity (such as copper or aluminium). We model the extraction and the refinement of a valuable substance which is mixed with waste material, in a way which is fully consistent with the physical constraints of the process. The resulting analytical description of production unambiguously reveals that perfect substitutability between production inputs fails if a corrected thermodynamic approach is used. We analyze the equilibrium pricing of a commodity extracted in an irreversible way. The thermodynamic model allows for the calculation of the "energy yield" (energy return on energy invested) of production alongside a financial (real) return in a two-period investment decision. The two investment criteria correspond in our economy to a different choice of numeraire and means of payment and corresponding views of the value of energy resources. Under an energy numeraire, energy resources will naturally be used in a more parsimonious way.
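    As a rough, purely illustrative sketch of the two investment criteria mentioned above (the symbols below are generic, not the paper's notation), the energy yield and the two-period returns under the two numeraires can be written as

        \[
        \mathrm{EROEI} \;=\; \frac{E_{\mathrm{out}}}{E_{\mathrm{in}}},
        \qquad
        R_{\mathrm{money}} \;=\; \frac{p_1\, q_1}{p_0\, q_0} - 1,
        \qquad
        R_{\mathrm{energy}} \;=\; \frac{e(q_1)}{e(q_0)} - 1,
        \]

    where E_in is the energy invested in extraction and refinement, E_out the energy content of the refined output, p_t and q_t the price and quantity of the commodity at time t, and e(q) the energy embodied in the quantity q. Which criterion ranks a given project higher depends on the numeraire, which is precisely the choice of numeraire and means of payment discussed in the abstract.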

    Measuring Industry Relatedness and Corporate Coherence

    Since the seminal work of Teece et al. (1994), firm diversification has been found to be a non-random process. The hidden deterministic nature of diversification patterns is usually detected by comparing expected (under a null hypothesis) and actual values of some statistic. Nevertheless, the standard approach presents two major drawbacks, leaving several issues unanswered. First, using the observed value of a statistic provides noisy and non-homogeneous estimates; second, the expected values are computed under a specific and privileged null hypothesis that implies spurious random effects. We show that using Monte Carlo p-scores as a measure of relatedness provides cleaner and homogeneous estimates. Using the NBER database on corporate patents, we investigate the effect of assuming different null hypotheses, from the least constrained to the fully constrained, revealing that new features in firm diversification patterns can be detected if random artifacts are ruled out.
    Keywords: corporate coherence; relatedness; null model analysis; patent data
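    A minimal sketch of the Monte Carlo p-score idea (illustrative only: the row-shuffling null model, the function name and the toy data below are assumptions, not the exact procedure of the paper):

        import numpy as np

        def mc_pscores(M, n_samples=1000, seed=0):
            """Monte Carlo p-scores for technology co-occurrence.

            M : binary firm-by-technology matrix (1 = firm patents in that class).
            Returns P where P[i, j] is the fraction of randomized matrices whose
            co-occurrence count for classes (i, j) is at least the observed one.
            The null model here reshuffles each firm's activities (preserving
            row sums only); other null models can be plugged in instead.
            """
            rng = np.random.default_rng(seed)
            observed = M.T @ M                          # observed co-occurrence counts
            exceed = np.zeros_like(observed, dtype=float)
            for _ in range(n_samples):
                R = np.array([rng.permutation(row) for row in M])
                exceed += (R.T @ R >= observed)
            return exceed / n_samples

        # Example: 5 firms x 4 technology classes (made-up data)
        M = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [0, 0, 1, 1],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1]])
        P = mc_pscores(M)
        print(P)  # low p-scores flag pairs co-occurring more often than under the null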

    Assessing systemic risk due to fire sales spillover through maximum entropy network reconstruction

    Assessing systemic risk in financial markets is of great importance, but it often requires data that are unavailable or available at a very low frequency. For this reason, systemic risk assessment with partial information is potentially very useful for regulators and other stakeholders. In this paper we consider systemic risk due to fire sales spillover and portfolio rebalancing by using the risk metrics defined by Greenwood et al. (2015). Using the Maximum Entropy principle, we propose a method to assess aggregated and individual banks' systemicness and vulnerability, and to statistically test for a change in these variables, when only the size of each bank and the capitalization of the investment assets are available. We demonstrate the effectiveness of our method on 2001-2013 quarterly data of US banks for which portfolio composition is available.
    Comment: 36 pages, 6 figures. Accepted for publication in the Journal of Economic Dynamics and Control.
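    A minimal sketch of the maximum-entropy step: given only each bank's total holdings and each asset's total capitalization, the maximum-entropy bank-asset matrix consistent with those margins is the normalized outer product of the margins. The figures below are made up, and the systemicness and vulnerability metrics of Greenwood et al. (2015) are not reproduced here.

        import numpy as np

        # Known margins only: total portfolio size of each bank and total
        # capitalization of each asset class (illustrative numbers).
        bank_sizes = np.array([300.0, 150.0, 50.0])     # row sums
        asset_caps = np.array([200.0, 200.0, 100.0])    # column sums
        assert np.isclose(bank_sizes.sum(), asset_caps.sum())

        # Maximum-entropy reconstruction of the bank-asset holdings matrix:
        # W[i, j] = r_i * c_j / S, the least-informative matrix with these margins.
        S = bank_sizes.sum()
        W = np.outer(bank_sizes, asset_caps) / S

        print(W)
        print(W.sum(axis=1), W.sum(axis=0))  # margins are recovered exactly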

    Volatility Forecasting: The Jumps Do Matter

    This study reconsiders the role of jumps for volatility forecasting by showing that jumps have a positive and mostly significant impact on future volatility. This result becomes apparent once volatility is correctly separated into its continuous and discontinuous components. To this purpose, we introduce the concept of threshold multipower variation (TMPV), which is based on the joint use of bipower variation and threshold estimation. With respect to alternative methods, our TMPV estimator provides less biased and more robust estimates of the continuous quadratic variation and jumps. This technique also provides a new test for jump detection which has substantially more power than traditional tests. We use this separation to forecast volatility by employing a heterogeneous autoregressive (HAR) model which is suitable to parsimoniously model long memory in realized volatility time series. Empirical analysis shows that the proposed techniques significantly improve the accuracy of volatility forecasts for the S&P500 index, single stocks and US bond yields, especially in periods following the occurrence of a jump.
    Keywords: volatility forecasting, jumps, bipower variation, threshold estimation, stock, bond
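    A minimal sketch of the separation idea on one day of intraday returns; the threshold rule and the HAR specification below are simplified stand-ins for the estimators and the model used in the paper, not the paper's exact procedure.

        import numpy as np

        def realized_measures(r, c=3.0):
            """Split daily realized variance into continuous and jump parts.

            r : intraday log returns for one day.
            Uses realized variance, bipower variation and a thresholded bipower
            variation; the threshold is c^2 times a crude per-return variance scale.
            """
            rv = np.sum(r ** 2)                              # realized variance
            mu1 = np.sqrt(2.0 / np.pi)
            bpv = mu1 ** -2 * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
            theta = (c ** 2) * bpv / len(r)                  # crude threshold
            keep = r ** 2 <= theta
            tbpv = mu1 ** -2 * np.sum(np.abs(r[1:]) * np.abs(r[:-1]) * keep[1:] * keep[:-1])
            jump = max(rv - tbpv, 0.0)                       # discontinuous component
            return rv, tbpv, jump

        def har_forecast(rv, jumps):
            """HAR-style regression of next-day RV on daily/weekly/monthly RV and jumps."""
            rv, jumps = np.asarray(rv), np.asarray(jumps)
            y, X = [], []
            for t in range(22, len(rv) - 1):
                X.append([1.0, rv[t], rv[t-4:t+1].mean(), rv[t-21:t+1].mean(), jumps[t]])
                y.append(rv[t + 1])
            beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
            return beta  # a positive jump coefficient corresponds to the effect above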

    Threshold Bipower Variation and the Impact of Jumps on Volatility Forecasting

    This study reconsiders the role of jumps for volatility forecasting by showing that jumps have a positive and mostly significant impact on future volatility. This result becomes apparent once volatility is separated into its continuous and discontinuous components using estimators which are not only consistent, but also scarcely plagued by small-sample bias. To this purpose, we introduce the concept of threshold bipower variation, which is based on the joint use of bipower variation and threshold estimation. We show that its generalization (threshold multipower variation) admits a feasible central limit theorem in the presence of jumps and provides less biased estimates of the continuous quadratic variation in finite samples than the standard multipower variation. We further provide a new test for jump detection which has substantially more power than tests based on multipower variation. Empirical analysis (on the S&P500 index, individual stocks and US bond yields) shows that the proposed techniques significantly improve the accuracy of volatility forecasts, especially in periods following the occurrence of a jump.
    Keywords: volatility estimation, jump detection, volatility forecasting, threshold estimation, financial markets
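    For reference, a sketch of the threshold bipower variation estimator in the form commonly used in this strand of the literature (notation simplified; theta_j denotes a per-observation threshold):

        \[
        \mathrm{TBPV} \;=\; \mu_1^{-2} \sum_{j=2}^{n}
        |\Delta_j X|\,|\Delta_{j-1} X|\;
        \mathbf{1}_{\{(\Delta_j X)^2 \le \theta_j\}}\,
        \mathbf{1}_{\{(\Delta_{j-1} X)^2 \le \theta_{j-1}\}},
        \qquad \mu_1 = \sqrt{2/\pi},
        \]

    where \Delta_j X is the j-th intraday return. Returns whose square exceeds the threshold are attributed to jumps and excluded from the products, which is what reduces the small-sample bias of plain bipower variation in the presence of jumps.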

    Theoretical and Empirical Essays on the Dynamics of Financial and Energy Markets

    This thesis is inspired by two main lines of research. The topics are analyzed in Chapters 1, 3, 4, 5 and 6, while Chapter 2 is devoted to helping the reader unfamiliar with the concepts of measure theory and stochastic processes. The first line of research highlights a drawback of the standard economic equilibrium model. We start from a question mainly raised by the ecological problem: is the economic equilibrium consistent with the physical world? The answer seems to be negative. Economic equilibrium theory establishes the optimal level of production and consumption of goods. Consumption is, in fact, a social issue: it depends on what consumers prefer for their own utility, and for this reason it is not directly related to the laws of physics. Production, however, is unavoidably linked to a physical process: the thermodynamic transformation of basic commodities into elaborated ones, useful for consumption. Despite this fact, most economic models in the mainstream literature completely neglect the thermodynamic cycles hidden in every production process. In the last two decades the ecological problem has gained attention in the scientific community, with a focus on the role of thermodynamic efficiency in the conversion of energy into work as a factor of economic growth. In Chapter 1 we propose an analytical approach to economic equilibrium which takes thermodynamic efficiency into account. Our idea is to show that if irreversibility is present the classical economic equilibrium changes, resulting in a more parsimonious use of energy. Standard economic equilibrium theory implies that the equilibrium itself remains unchanged if the numeraire adopted to price goods is changed, i.e. all numeraires are equivalent. This is a strong and quite controversial result: since the conversion of energy into work is intrinsically irreversible, it is intuitive that energy is a special commodity, not equivalent to the others. Pricing in terms of energy should not be equivalent to pricing in terms of other goods, which are in fact obtained from energy itself. Moreover, the proposed "thermodynamically consistent" economy reduces to the classical one if the production process is reversible. In this sense economic theory implicitly assumes that all production processes are reversible, an assumption that conflicts with any real-world production process.
    The second line of research is independent of the first and is mainly devoted to the analysis of discontinuities in the prices of assets quoted in financial markets. Several drastic events are known to have influenced and changed the state of financial markets. In such situations the uncertainty hidden in financial assets increases rapidly, driving the markets into a very turbulent state. Examples of such events are the 1929 Wall Street crash, the Black Monday crisis of 1987 and the 9/11 terrorist attack. These are rare events of very high intensity. Many discontinuities of smaller amplitude affect the behavior of assets: on average we can identify, visually, 5-10 such abrupt variations per year. Rapid and intense variations of this kind are usually referred to as jumps. In this context we expect that, after a jump has occurred, the market switches to a new state characterized by a high level of volatility. As a consequence, jumps are expected to have predictive power on the future behaviour of assets. Although this is very intuitive, it had not yet been proved in the financial literature.
    A volatility forecasting model requires the definition of a volatility proxy. The idea is that the proxies adopted in the literature for forecasting purposes are, in fact, contaminated by the presence of jumps in finite samples, so that the forecasting power of jumps on future volatility cannot be revealed. Highlighting this feature requires a precise estimate of the jump component. In this spirit we propose in Chapter 3 a powerful jump separation technique and test its performance on eight electricity markets. The separation technique we adopt is taken from very recent results in the financial literature and only requires the introduction of a threshold. In Chapter 4 we construct precise volatility estimators using the threshold separation technique; this approach allows for a volatility estimate unaffected by jumps. Moreover, in Chapter 5 we show that a jump-purified estimate of volatility allows for a better investigation of its memory properties. Finally, in Chapter 6 we construct a volatility forecasting model based on the proposed estimators. Being based on an accurate separation of the continuous and discontinuous components of volatility, the model reveals the forecasting power of jumps on future volatility, and we find that this forecasting power extends to at least one month. A striking example of the turbulence triggered by discontinuous variations is the recent market crisis. In September 2008 a global crash occurred in most stock exchanges, and since then markets have shown a high level of volatility, characterized by large returns of both negative and positive sign. In this sense our results are very topical and constitute a basis for further investigation.

    Statistical inferences for price staleness

    This paper proposes a nonparametric theory for statistical inferences on zero returns of high-frequency asset prices. Using an infill asymptotic design, we derive limit theorems for the percentage of zero returns observed on a finite time interval and for other related quantities. Within this framework, we develop two nonparametric tests. First, we test whether intra-day zero returns are independent and identically distributed. Second, we test whether intra-day variation of the likelihood of occurrence of zero returns can be solely explained by a deterministic diurnal pattern. In an empirical application to ten representative stocks of the NYSE, we provide evidence that the null of independent and identically distributed intra-day zero returns can be conclusively rejected. We further find that a deterministic diurnal pattern is not sufficient to explain the intra-day variability of the distribution of zero returns.
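    A minimal, purely illustrative sketch of the kind of quantities involved: the fraction of intraday zero returns and a simple Wald-Wolfowitz runs test for serial dependence in the zero/nonzero sequence. This is a generic stand-in, not the tests developed in the paper.

        import numpy as np
        from scipy.stats import norm

        def staleness_and_runs_test(prices):
            """Fraction of zero intraday returns and a runs test for their independence.

            prices : array of intraday prices for one day.
            Returns (fraction of zero returns, two-sided p-value of the runs test).
            """
            r = np.diff(np.log(prices))
            z = (r == 0).astype(int)                 # 1 = stale (zero return)
            frac = z.mean()
            n1, n0 = z.sum(), len(z) - z.sum()
            if n1 == 0 or n0 == 0:
                return frac, np.nan
            runs = 1 + np.count_nonzero(np.diff(z))  # number of runs of 0s / 1s
            mu = 2.0 * n1 * n0 / (n1 + n0) + 1.0
            var = 2.0 * n1 * n0 * (2.0 * n1 * n0 - n1 - n0) / ((n1 + n0) ** 2 * (n1 + n0 - 1))
            stat = (runs - mu) / np.sqrt(var)
            return frac, 2.0 * norm.sf(abs(stat))

        # Example with synthetic prices containing flat (stale) stretches
        p = np.array([10.00, 10.00, 10.01, 10.01, 10.01, 10.02, 10.02, 10.02, 10.02, 10.03])
        print(staleness_and_runs_test(p))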

    Detecting correlations among functional-sequence motifs

    Sequence motifs are words of nucleotides in DNA with biological functions, e.g., gene regulation. Identification of such words proceeds through rejection of Markov models on the expected motif frequency along the genome. Additional biological information can be extracted from the correlation structure among patterns of motif occurrences. In this paper a log-linear multivariate intensity Poisson model is estimated via expectation maximization on a set of motifs along the genome of E. coli K12. The proposed approach allows for excitatory as well as inhibitory interactions among motifs and between motifs and other genomic features like gene occurrences. Our findings confirm previous stylized facts about such types of interactions and shed new light on genome-maintenance functions of some particular motifs. We expect these methods to be applicable to a wider set of genomic features.
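    A simplified sketch of the modelling idea, using a plain log-linear Poisson regression of one motif's counts on binned counts of other features. This stand-in omits the multivariate intensity structure and the EM estimation used in the paper; the data and names below are made up.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: counts per 10-kb genome bin.
        n_bins = 500
        gene_counts = rng.poisson(2.0, n_bins)   # e.g. gene occurrences per bin
        other_motif = rng.poisson(5.0, n_bins)   # occurrences of a second motif
        # Target motif generated with an excitatory link to the other motif
        # and an inhibitory link to gene density.
        lam = np.exp(0.2 + 0.15 * other_motif - 0.10 * gene_counts)
        target_motif = rng.poisson(lam)

        # Log-linear Poisson model: log E[target] = b0 + b1*other_motif + b2*gene_counts
        X = sm.add_constant(np.column_stack([other_motif, gene_counts]))
        fit = sm.GLM(target_motif, X, family=sm.families.Poisson()).fit()
        print(fit.params)  # signs of b1, b2 indicate excitatory vs inhibitory association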